Uncertainty: An Extra Layer of Security for Unauthorized Traffic Based Web Services

Authors

  • Parag Agarwal
  • B. Prabhakaran
  • Bhavani Thuraisingham
Abstract

Distributed web services are under constant threat of attack from nodes internal or external to the system. Internal attacks may result from the hijacking of trusted web servers, leading to loss or corruption of information and denial of service (DoS) to clients. External attacks can arise from hijacked trusted clients or malicious nodes, likewise causing DoS to clients. This paper focuses on building an attack-resistant framework for web services that handle unauthorized traffic. Unauthorized traffic arises in query-driven, sessionless, HTTP request/response web service applications such as google.com and dictionary.com. Such applications are expected to have low response times; unfortunately, current mechanisms support this traffic poorly, since they add extra delay through processing at intermediate nodes. The paper proposes a framework that optimizes the use of secure overlay services for unauthorized traffic. We add an extra layer of security around the web servers that introduces uncertainty into the adversary's actions: dummy servers are added to the existing system and appear to clients and adversaries as real servers. The dummy servers act as traps when an adversary attacks them in the belief that they are real. Secure strategies are proposed to implement the dummy servers. These strategies reduce the risk of hijacking and DoS attacks, minimize changes to the external infrastructure, integrate easily with existing security systems, do not require ISP collaboration, and help the system scale.
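The abstract describes the dummy-server idea only at a high level. As a rough illustration, the following minimal Python sketch (not the authors' implementation; the names Endpoint, Dispatcher, advertised_pool, and handle_request are hypothetical) shows how decoy servers might be advertised alongside real ones so that an adversary probing the service cannot tell which endpoints are real, while any request that reaches a decoy is flagged as suspicious. How legitimate clients are steered toward real servers is governed by the paper's secure strategies and is abstracted away here.

import random
from dataclasses import dataclass, field

@dataclass(frozen=True)
class Endpoint:
    address: str
    is_decoy: bool  # True for a dummy/trap server, False for a real web server

@dataclass
class Dispatcher:
    """Front end that publishes server addresses and observes where requests land.

    Real and decoy addresses are advertised together, so an external observer
    cannot tell which servers actually serve content; routing of legitimate
    clients to real servers is abstracted away in this sketch."""
    endpoints: list
    flagged: set = field(default_factory=set)

    def advertised_pool(self):
        # What an adversary sees: real and decoy addresses, shuffled together.
        pool = [e.address for e in self.endpoints]
        random.shuffle(pool)
        return pool

    def handle_request(self, client_id: str, target: str) -> str:
        endpoint = next(e for e in self.endpoints if e.address == target)
        if endpoint.is_decoy:
            # A decoy never serves real content; any contact with it is logged
            # as suspicious, turning the decoy into a trap for the attacker.
            self.flagged.add(client_id)
            return "trapped"
        return "served"

if __name__ == "__main__":
    servers = [Endpoint("10.0.0.1", False), Endpoint("10.0.0.2", False),
               Endpoint("10.0.0.3", True),  Endpoint("10.0.0.4", True)]
    dispatcher = Dispatcher(servers)
    print("advertised:", dispatcher.advertised_pool())
    # An attacker that probes every advertised address eventually hits a decoy.
    for addr in dispatcher.advertised_pool():
        print(addr, "->", dispatcher.handle_request("attacker-1", addr))
    print("flagged clients:", dispatcher.flagged)

With k real servers hidden among N advertised addresses, a blind attack on a single address hits a trap with probability (N - k)/N, which is the kind of uncertainty the proposed layer introduces for the adversary.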

Related articles

Detecting Bot Networks Based On HTTP And TLS Traffic Analysis

Bot networks are a serious threat to cyber security, and their destructive behavior directly affects network performance. Detecting infected HTTP communications is a big challenge because infected HTTP connections blend in with other types of HTTP traffic. Cybercriminals prefer to use the web as a communication environment to launch application layer attacks and secretly enga...

QoS-Based web service composition based on genetic algorithm

Quality of service (QoS) is an important issue in the design and management of web service composition. QoS in web services consists of various non-functional factors, such as execution cost, execution time, availability, successful execution rate, and security. In recent years, the number of available web services has proliferated, and increasingly many of them offer the same functionality. The same web ...

Anomaly-based Web Attack Detection: The Application of Deep Neural Network Seq2Seq With Attention Mechanism

Today, the use of the Internet and websites has become an integral part of people's lives, and most activities and important data reside on websites. Thus, attempts to intrude into these websites have grown exponentially. Intrusion detection systems (IDS) for web attacks are an approach to protecting users, but these systems suffer from drawbacks such as low accuracy in ...

Anomaly Base Network Intrusion Detection by Using Random Decision Tree and Random Projection: A Fast Network Intrusion Detection Technique

Network Intrusion Detection Systems (NIDSs) have become an important component of network security infrastructure. Currently, many NIDSs are rule-based systems whose performance depends heavily on their rule sets. Unfortunately, due to the huge volume of network traffic, coding the rules by security experts becomes difficult and time-consuming. Since data mining techniques can build network intr...

Image flip CAPTCHA

The massive and automated access to Web resources through robots has made it essential for Web service providers to determine whether a "user" is a human or a robot. A Human Interaction Proof (HIP), such as the Completely Automated Public Turing test to tell Computers and Humans Apart (CAPTCHA), offers a way to make this distinction. CAPTCHA is a reverse Turing test used by Web serv...


Journal:

Volume   Issue

Pages  -

Publication date: 2006